Airbyte (dagster-airbyte)

This library provides a Dagster integration with Airbyte.

For more information on getting started, see the Airbyte integration guide.

Resources

dagster_airbyte.AirbyteResource ResourceDefinition[source]

Config Schema:
request_max_retries (Union[dagster.IntSource, None], optional):

The maximum number of times requests to the Airbyte API should be retried before failing.

Default Value: 3

request_retry_delay (Union[Float, None], optional):

Time (in seconds) to wait between each request retry.

Default Value: 0.25

request_timeout (Union[dagster.IntSource, None], optional):

Time (in seconds) after which the requests to Airbyte are declared timed out.

Default Value: 15

cancel_sync_on_run_termination (Union[dagster.BoolSource, None], optional):

Whether to cancel a sync in Airbyte if the Dagster runner is terminated. This may be useful to disable if using Airbyte sources that cannot be cancelled and resumed easily, or if your Dagster deployment may experience runner interruptions that do not impact your Airbyte deployment.

Default Value: True

poll_interval (Union[Float, None], optional):

Time (in seconds) to wait between checking a sync’s status.

Default Value: 10

host (dagster.StringSource):

The Airbyte server address.

port (dagster.StringSource):

Port used for the Airbyte server.

username (Union[dagster.StringSource, None], optional):

Username if using basic auth.

Default Value: None

password (Union[dagster.StringSource, None], optional):

Password if using basic auth.

Default Value: None

use_https (Union[dagster.BoolSource, None], optional):

Whether to use HTTPS to connect to the Airbyte server.

Default Value: False

forward_logs (Union[dagster.BoolSource, None], optional):

Whether to forward Airbyte logs to the compute log. This can be expensive for long-running syncs.

Default Value: True

request_additional_params (Union[dict, None], optional):

Any additional kwargs to pass to the requests library when making requests to Airbyte.

Default Value:
{}

This resource allows users to programmatically interface with the Airbyte REST API to launch syncs and monitor their progress.

Examples:

from dagster import Definitions, EnvVar
from dagster_airbyte import AirbyteResource, build_airbyte_assets

my_airbyte_resource = AirbyteResource(
    host=EnvVar("AIRBYTE_HOST"),
    port=EnvVar("AIRBYTE_PORT"),
    # If using basic auth
    username=EnvVar("AIRBYTE_USERNAME"),
    password=EnvVar("AIRBYTE_PASSWORD"),
)

airbyte_assets = build_airbyte_assets(
    connection_id="87b7fe85-a22c-420e-8d74-b30e7ede77df",
    destination_tables=["releases", "tags", "teams"],
)

defs = Definitions(
    assets=airbyte_assets,
    resources={"airbyte": my_airbyte_resource},
)
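
The resource can also be invoked directly from an op to trigger a sync. The following is a minimal sketch, assuming the resource's sync_and_poll method (the call airbyte_sync_op uses internally); the connection ID is illustrative:

from dagster import Definitions, EnvVar, job, op
from dagster_airbyte import AirbyteResource

@op
def trigger_sync(airbyte: AirbyteResource):
    # Launch the sync for this connection and poll until it completes or fails.
    airbyte.sync_and_poll(connection_id="87b7fe85-a22c-420e-8d74-b30e7ede77df")

@job
def my_airbyte_job():
    trigger_sync()

defs = Definitions(
    jobs=[my_airbyte_job],
    resources={
        "airbyte": AirbyteResource(
            host=EnvVar("AIRBYTE_HOST"),
            port=EnvVar("AIRBYTE_PORT"),
        )
    },
)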

Assets

dagster_airbyte.load_assets_from_airbyte_instance(airbyte, workspace_id=None, key_prefix=None, create_assets_for_normalization_tables=True, connection_to_group_fn=<function _clean_name>, connection_meta_to_group_fn=None, io_manager_key=None, connection_to_io_manager_key_fn=None, connection_filter=None, connection_to_asset_key_fn=None, connection_to_freshness_policy_fn=None, connection_to_auto_materialize_policy_fn=None)[source]

Loads Airbyte connection assets from a configured AirbyteResource instance. This fetches information about defined connections at initialization time, and will error on workspace load if the Airbyte instance is not reachable.

Parameters:
  • airbyte (ResourceDefinition) – An AirbyteResource configured with the appropriate connection details.

  • workspace_id (Optional[str]) – The ID of the Airbyte workspace to load connections from. Only required if multiple workspaces exist in your instance.

  • key_prefix (Optional[CoercibleToAssetKeyPrefix]) – A prefix for the asset keys created.

  • create_assets_for_normalization_tables (bool) – If True, assets will be created for tables created by Airbyte’s normalization feature. If False, only the destination tables will be created. Defaults to True.

  • connection_to_group_fn (Optional[Callable[[str], Optional[str]]]) – Function which returns an asset group name for a given Airbyte connection name. If None, no groups will be created. Defaults to a basic sanitization function.

  • connection_meta_to_group_fn (Optional[Callable[[AirbyteConnectionMetadata], Optional[str]]]) – Function which returns an asset group name for a given Airbyte connection metadata. If None and connection_to_group_fn is None, no groups will be created.

  • io_manager_key (Optional[str]) – The I/O manager key to use for all assets. Defaults to “io_manager”. Use this if all assets should be loaded from the same source, otherwise use connection_to_io_manager_key_fn.

  • connection_to_io_manager_key_fn (Optional[Callable[[str], Optional[str]]]) – Function which returns an I/O manager key for a given Airbyte connection name. When other ops are downstream of the loaded assets, the IOManager specified determines how the inputs to those ops are loaded. Defaults to “io_manager”.

  • connection_filter (Optional[Callable[[AirbyteConnectionMetadata], bool]]) – Optional function which takes in connection metadata and returns False if the connection should be excluded from the output assets.

  • connection_to_asset_key_fn (Optional[Callable[[AirbyteConnectionMetadata, str], AssetKey]]) – Optional function which takes in connection metadata and table name and returns an asset key for the table. If None, the default asset key is based on the table name. Any asset key prefix will be applied to the output of this function.

  • connection_to_freshness_policy_fn (Optional[Callable[[AirbyteConnectionMetadata], Optional[FreshnessPolicy]]]) – Optional function which takes in connection metadata and returns a freshness policy for the connection’s assets. If None, no freshness policies will be applied to the assets.

  • connection_to_auto_materialize_policy_fn (Optional[Callable[[AirbyteConnectionMetadata], Optional[AutoMaterializePolicy]]]) – Optional function which takes in connection metadata and returns an auto materialization policy for the connection’s assets. If None, no auto materialization policies will be applied to the assets.

Examples:

Loading all Airbyte connections as assets:

from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",
        "port": "8000",
    }
)
airbyte_assets = load_assets_from_airbyte_instance(airbyte_instance)

Filtering the set of loaded connections:

from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",
        "port": "8000",
    }
)
airbyte_assets = load_assets_from_airbyte_instance(
    airbyte_instance,
    connection_filter=lambda meta: "snowflake" in meta.name,
)
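
Prefixing asset keys and assigning group names (a sketch using the key_prefix and connection_to_group_fn parameters; the sanitization lambda is illustrative, not part of the API):

import re

from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",
        "port": "8000",
    }
)

# Prefix every asset key with "raw" and derive a group name from each
# connection name (simple sanitization, shown for illustration).
airbyte_assets = load_assets_from_airbyte_instance(
    airbyte_instance,
    key_prefix=["raw"],
    connection_to_group_fn=lambda name: re.sub(r"[^a-z0-9]+", "_", name.lower()),
)
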
dagster_airbyte.build_airbyte_assets(connection_id, destination_tables, destination_database=None, destination_schema=None, asset_key_prefix=None, group_name=None, normalization_tables=None, deps=None, upstream_assets=None, schema_by_table_name=None, freshness_policy=None, stream_to_asset_map=None, auto_materialize_policy=None)[source]

Builds a set of assets representing the tables created by an Airbyte sync operation.

Parameters:
  • connection_id (str) – The Airbyte Connection ID that this op will sync. You can retrieve this value from the “Connections” tab of a given connector in the Airbyte UI.

  • destination_tables (List[str]) – The names of the tables that you want to be represented in the Dagster asset graph for this sync. This will generally map to the name of the stream in Airbyte, unless a stream prefix has been specified in Airbyte.

  • destination_database (Optional[str]) – The name of the destination database.

  • destination_schema (Optional[str]) – The name of the destination schema.

  • normalization_tables (Optional[Mapping[str, List[str]]]) – If you are using Airbyte’s normalization feature, you may specify a mapping of destination table to a list of derived tables that will be created by the normalization process.

  • asset_key_prefix (Optional[List[str]]) – A prefix for the asset keys created by this function. If left blank, assets will have a key of AssetKey([table_name]).

  • deps (Optional[Sequence[Union[AssetsDefinition, SourceAsset, str, AssetKey]]]) – A list of assets to add as sources.

  • upstream_assets (Optional[Set[AssetKey]]) – Deprecated, use deps instead. A list of assets to add as sources.

  • freshness_policy (Optional[FreshnessPolicy]) – A freshness policy to apply to the assets.

  • stream_to_asset_map (Optional[Mapping[str, str]]) – A mapping of an Airbyte stream name to a Dagster asset. This allows the use of the “prefix” setting in Airbyte with special characters that aren’t valid asset names.

  • auto_materialize_policy (Optional[AutoMaterializePolicy]) – An auto materialization policy to apply to the assets.
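
Examples:

Building assets for two destination tables, including tables produced by Airbyte’s normalization feature (a minimal sketch; the connection ID, table names, and normalization mapping are illustrative):

from dagster import Definitions
from dagster_airbyte import AirbyteResource, build_airbyte_assets

airbyte_assets = build_airbyte_assets(
    connection_id="87b7fe85-a22c-420e-8d74-b30e7ede77df",
    destination_tables=["releases", "tags"],
    normalization_tables={"releases": ["releases_authors"]},
    asset_key_prefix=["github"],
)

defs = Definitions(
    # build_airbyte_assets returns a list of asset definitions.
    assets=airbyte_assets,
    resources={"airbyte": AirbyteResource(host="localhost", port="8000")},
)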

Ops

dagster_airbyte.airbyte_sync_op = <dagster._core.definitions.op_definition.OpDefinition object>[source]

Config Schema:
connection_id (dagster.StringSource):

The Airbyte Connection ID that this op will sync. You can retrieve this value from the “Connections” tab of a given connector in the Airbyte UI.

poll_interval (Union[Float, None], optional):

The time (in seconds) to wait between successive checks of the sync’s status.

Default Value: 10

poll_timeout (Union[Float, None], optional):

The maximum time (in seconds) that will be waited before this operation is timed out. By default, this will never time out.

Default Value: None

yield_materializations (Union[dagster.BoolSource, None], optional):

If True, materializations corresponding to the results of the Airbyte sync will be yielded when the op executes.

Default Value: True

asset_key_prefix (Union[List[dagster.StringSource], None], optional):

If provided and yield_materializations is True, these components will be used to prefix the generated asset keys.

Default Value: ['airbyte']

Executes an Airbyte job sync for a given connection_id and polls until that sync completes, raising an error if it is unsuccessful. It outputs an AirbyteOutput which contains the job details for a given connection_id.

It requires the use of the airbyte_resource, which allows it to communicate with the Airbyte API.

Examples:

from dagster import job, op
from dagster_airbyte import airbyte_resource, airbyte_sync_op

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
    }
)

sync_foobar = airbyte_sync_op.configured({"connection_id": "foobar"}, name="sync_foobar")

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_simple_airbyte_job():
    sync_foobar()
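
# Placeholder ops for the composed job below; they are not part of
# dagster_airbyte and should be replaced with your own ops.
@op
def some_op():
    pass

@op
def other_op(airbyte_output):
    pass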

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_composed_airbyte_job():
    final_foobar_state = sync_foobar(start_after=some_op())
    other_op(final_foobar_state)